Dynamics of Supervised Learning with Restricted Training Sets
We study the dynamics of supervised learning in layered neural networks, in the regime where the size p of the training set is proportional to the number N of inputs. Here the local fields are no longer described by Gaussian distributions. We use dynamical replica theory to predict the evolution of macroscopic observables, including the relevant error measures, recovering the old formalism in the limit p/N → ∞.
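The regime described above can be illustrated with a small simulation. The sketch below is not the paper's replica calculation; it is a hedged toy example (all names and parameter choices are my own) of a perceptron student trained on a fixed set of p = αN examples, so that data recycling is unavoidable, with training and generalization errors measured at the end.

```python
import numpy as np

# Toy sketch (assumption: perceptron student, noiseless linear teacher),
# illustrating the restricted-training-set regime p = alpha * N.
rng = np.random.default_rng(0)
N, alpha, steps, eta = 50, 2.0, 2000, 0.05
p = int(alpha * N)                               # restricted training set size

teacher = rng.standard_normal(N)
X = rng.standard_normal((p, N)) / np.sqrt(N)     # fixed set of p examples
y = np.sign(X @ teacher)                         # teacher labels

w = np.zeros(N)
for t in range(steps):
    i = rng.integers(p)                          # recycle from the fixed set
    if np.sign(X[i] @ w) != y[i]:
        w += eta * y[i] * X[i]                   # perceptron update

train_err = np.mean(np.sign(X @ w) != y)
Xtest = rng.standard_normal((1000, N)) / np.sqrt(N)
gen_err = np.mean(np.sign(Xtest @ w) != np.sign(Xtest @ teacher))
print(train_err, gen_err)
```

Because the same p examples are revisited many times, the student's field distribution develops correlations with the training data, which is why the Gaussian-field assumption of the infinite-set formalism breaks down.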
On-Line Learning with Restricted Training Sets: Exact Solution as Benchmark for General Theories
O(ws(s log d + log(dqh/s))) and O(ws((h/s) log q) + log(dqh/s)) are upper bounds for the VC-dimension of a set of neural networks of units with piecewise polynomial activation functions, where s is the depth of the network, h is the number of hidden units, w is the number of adjustable parameters, q is the maximum of the number of polynomial segments of the activation function, and d is the maximum degree of the polynomials; also Ω(ws log(dqh/s)) is a lower bound for the VC-dimension of such a network set, which are tight for the cases s = Θ(h) and s is constant. For the special case q = 1, the VC-dimension is Θ(ws log d).
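Written out in display form, the stated bounds read as follows. This is a reconstruction: the plus signs and the Ω/Θ symbols appear to have been lost in extraction, so the exact grouping is an assumption.

```latex
\[
  \mathrm{VCdim} = O\bigl(ws\,(s\log d + \log(dqh/s))\bigr), \qquad
  \mathrm{VCdim} = O\bigl(ws\,(h/s)\log q + \log(dqh/s)\bigr),
\]
\[
  \mathrm{VCdim} = \Omega\bigl(ws\log(dqh/s)\bigr), \qquad
  \mathrm{VCdim} = \Theta(ws\log d) \ \text{ when } q = 1 .
\]
```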
Dynamics of Supervised Learning with Restricted Training Sets and Noisy Teachers
Coolen, Anthony C. C., Mace, C. W. H.
We generalize a recent formalism to describe the dynamics of supervised learning in layered neural networks, in the regime where data recycling is inevitable, to the case of noisy teachers. Our theory generates reliable predictions for the evolution in time of training- and generalization errors, and extends the class of mathematically solvable learning processes in large neural networks to those situations where overfitting can occur.
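The overfitting scenario this abstract refers to can be made concrete with a minimal sketch, again not the paper's method but a hedged illustration under my own assumptions: a noisy teacher generates labels, and a least-squares student fit on a restricted training set achieves a training error below the noise level while its generalization error lies above it.

```python
import numpy as np

# Hedged toy example (assumption: linear teacher with additive label noise,
# least-squares student): training error underestimates the noise level,
# generalization error exceeds it -- the student has overfit the noise.
rng = np.random.default_rng(1)
N, p, sigma = 50, 100, 0.5                         # inputs, examples, noise

teacher = rng.standard_normal(N) / np.sqrt(N)
X = rng.standard_normal((p, N))
y = X @ teacher + sigma * rng.standard_normal(p)   # noisy teacher outputs

w, *_ = np.linalg.lstsq(X, y, rcond=None)          # student fit

train_mse = np.mean((X @ w - y) ** 2)
Xt = rng.standard_normal((5000, N))
yt = Xt @ teacher + sigma * rng.standard_normal(5000)
gen_mse = np.mean((Xt @ w - yt) ** 2)
print(train_mse, gen_mse)                          # train_mse < gen_mse
```

The gap between the two errors is exactly the kind of observable whose time evolution the generalized formalism is built to predict.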